New class of limited-memory variationally-derived variable metric methods

Authors

  • Jan Vlček
  • Ladislav Lukšan
Abstract

We present a new family of limited-memory variationally-derived variable metric (VM) line search methods with the quadratic termination property for unconstrained minimization. Starting with x_0 ∈ R^N, VM line search methods (see [6], [3]) generate iterations x_{k+1} ∈ R^N by the process x_{k+1} = x_k + s_k, s_k = t_k d_k, where the direction vectors d_k ∈ R^N are descent, i.e. g_k^T d_k < 0, k ≥ 0, and the stepsizes t_k > 0 satisfy

f(x_{k+1}) − f(x_k) ≤ ε_1 t_k g_k^T d_k,   g_{k+1}^T d_k ≥ ε_2 g_k^T d_k,   (1)

for k ≥ 0, with 0 < ε_1 < 1/2 and ε_1 < ε_2 < 1, where f is the objective function and g_k = ∇f(x_k). We denote y_k = g_{k+1} − g_k, k ≥ 0, and by ‖·‖_F the Frobenius matrix norm. We describe the new family in Section 1 and, in Section 2, a correction formula which uses the previous vectors s_{k−1}, y_{k−1}. Numerical results are presented in Section 3.
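As a hedged illustration (not the authors' implementation), the weak Wolfe conditions (1) on the stepsize t_k can be enforced by a simple bracketing line search; the parameters eps1 and eps2 below play the roles of ε_1 and ε_2:

```python
# Hypothetical sketch: a bisection line search enforcing the weak Wolfe
# conditions (1) for a descent direction d, i.e.
#   f(x + t*d) - f(x) <= eps1 * t * g^T d   and   grad(x + t*d)^T d >= eps2 * g^T d.
import numpy as np

def wolfe_line_search(f, grad, x, d, eps1=1e-4, eps2=0.9, max_iter=50):
    gd = grad(x) @ d          # g_k^T d_k, must be negative for a descent direction
    fx = f(x)
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) - fx > eps1 * t * gd:     # sufficient-decrease condition fails
            hi = t
        elif grad(x + t * d) @ d < eps2 * gd:     # curvature condition fails
            lo = t
        else:
            return t                              # both conditions in (1) hold
        t = 0.5 * (lo + hi) if hi < np.inf else 2.0 * lo
    return t

# Example: f(x) = x^T x from x = (1, 1) along the steepest-descent direction.
f = lambda x: x @ x
grad = lambda x: 2.0 * x
x = np.array([1.0, 1.0])
d = -grad(x)
t = wolfe_line_search(f, grad, x, d)
```

For this quadratic example the search returns the exact minimizing step t = 0.5, consistent with the quadratic termination property the paper emphasizes.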


Similar articles

New class of limited-memory variationally-derived variable metric methods

A new family of limited-memory variationally-derived variable metric or quasi-Newton methods for unconstrained minimization is given. The methods have the quadratic termination property and use updates invariant under linear transformations. Some encouraging numerical experience is reported.

Full text

Recursive formulation of limited memory variable metric methods

In this report we propose a new recursive matrix formulation of limited-memory variable metric methods. This approach makes it possible to approximate both the Hessian matrix and its inverse, and can be used for an arbitrary update from the Broyden class (and some other updates). The new recursive formulation requires approximately 4mn multiplications and additions for the direction determination, so i...
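For context, the classical way limited-memory methods compute the direction at O(mn) cost is the two-loop (Strang) recursion of L-BFGS; the report's recursive matrix formulation is a different scheme, but the sketch below, under the assumption of the standard L-BFGS update with the usual scaling of the initial matrix, shows the kind of computation involved:

```python
# Minimal sketch of the standard L-BFGS two-loop (Strang) recursion, which
# computes H*g from the last m correction pairs (s_i, y_i) in O(mn) operations.
# This is the classical recursion, not the recursive matrix formulation
# proposed in the report above.
import numpy as np

def two_loop(g, S, Y):
    """S, Y: lists of pairs s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i,
    ordered oldest first. Returns the product H*g of the implicitly
    updated inverse Hessian approximation H with the vector g."""
    q = g.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(S, Y)]
    alphas = []
    for s, y, rho in reversed(list(zip(S, Y, rhos))):   # backward pass
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    gamma = (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])           # standard H_0 scaling
    r = gamma * q
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):  # forward pass
        b = rho * (y @ r)
        r += (a - b) * s
    return r

# The BFGS update enforces the secant condition H*y_k = s_k for the
# most recent pair, which the recursion reproduces:
s = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
hy = two_loop(y, [s], [y])
```

Here `hy` equals `s`, confirming the secant condition for the latest stored pair.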

Full text

Generalizations of the limited-memory BFGS method based on the quasi-product form of update

Two families of limited-memory variable metric or quasi-Newton methods for unconstrained minimization based on the quasi-product form of update are derived. For the first family, four variants of how to utilize the Strang recurrences for the Broyden class of variable metric updates are investigated; three of them use the same number of stored vectors as the limited-memory BFGS method. Moreover, one ...

Full text

Limited-memory projective variable metric methods for unconstrained minimization

A new family of limited-memory variable metric or quasi-Newton methods for unconstrained minimization is given. The methods are based on a positive definite inverse Hessian approximation in the form of the sum of the identity matrix and two low-rank matrices, obtained by the standard scaled Broyden class update. To reduce the rank of the matrices, various projections are used. Numerical experience is e...

Full text

Variable Metric Stochastic Approximation Theory

We provide a variable metric stochastic approximation theory. In doing so, we provide a convergence theory for a large class of online variable metric methods including the recently introduced online versions of the BFGS algorithm and its limited-memory LBFGS variant. We also discuss the implications of our results for learning from expert advice.

Full text


Journal title:

Volume   Issue 

Pages  -

Publication year: 2008